Mobile edge computing (MEC) has proven highly effective for computationally intensive mobile applications by offloading computation to a nearby server, thereby limiting the energy usage of user equipment (UE). However, selecting which set of application components to offload is an intricate problem, since it must account for the volume of data transferred and the associated communication latency. In this article, we introduce a novel energy-efficient offloading scheme based on deep neural networks. The proposed scheme trains an intelligent decision-making model that selects a robust set of application components to offload. The selection is based on factors such as the remaining UE battery power, network conditions, the volume of data transfer, the energy required by the application components, communication delays, and computational load. We design a cost function that incorporates all of these factors, compute the cost of every possible combination of component offloading decisions, select the robust decisions over an extensive dataset, and train a deep neural network as a substitute for the exhaustive computation involved. Experimental results show that the proposed scheme performs well in terms of accuracy, root mean square error (RMSE), mean absolute error (MAE), and UE energy consumption.
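To make the described workflow concrete, the sketch below illustrates one possible realization of the idea: enumerate every component offloading decision vector, score each with a weighted cost function over the factors listed above, and train a small neural network to imitate the exhaustive search. All feature names, cost weights, and network sizes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions throughout): exhaustive cost evaluation over all
# offloading combinations, then a DNN trained to replace that exhaustive search.
import itertools
import numpy as np
import torch
import torch.nn as nn

N_COMPONENTS = 6          # number of application components (assumed)
N_SAMPLES = 5000          # size of the synthetic training set (assumed)
rng = np.random.default_rng(0)

def cost(decision, state):
    """Hypothetical weighted cost of an offloading decision vector.

    decision: 0/1 array, 1 = offload the component to the edge server
    state: UE/network conditions (battery, bandwidth, data sizes,
           per-component energy demand, latency, computational load)
    """
    local_energy = np.sum((1 - decision) * state["comp_energy"])
    tx_energy = np.sum(decision * state["data_size"]) / state["bandwidth"]
    latency = np.sum(decision * state["data_size"]) * state["rtt"]
    load = np.sum((1 - decision) * state["cpu_load"])
    battery_penalty = local_energy / state["battery"]
    # illustrative weights; the paper's actual cost function may differ
    return (0.4 * local_energy + 0.3 * tx_energy + 0.2 * latency
            + 0.05 * load + 0.05 * battery_penalty)

def random_state():
    return {
        "comp_energy": rng.uniform(0.1, 1.0, N_COMPONENTS),
        "data_size": rng.uniform(0.1, 5.0, N_COMPONENTS),
        "cpu_load": rng.uniform(0.1, 1.0, N_COMPONENTS),
        "bandwidth": rng.uniform(1.0, 10.0),
        "rtt": rng.uniform(0.01, 0.2),
        "battery": rng.uniform(0.2, 1.0),
    }

# All 2^N possible offloading decision vectors.
all_decisions = np.array(list(itertools.product([0, 1], repeat=N_COMPONENTS)))

# Build the dataset: for each random state, exhaustively find the lowest-cost decision.
X, Y = [], []
for _ in range(N_SAMPLES):
    s = random_state()
    costs = [cost(d, s) for d in all_decisions]
    best = all_decisions[int(np.argmin(costs))]
    X.append(np.concatenate([s["comp_energy"], s["data_size"], s["cpu_load"],
                             [s["bandwidth"], s["rtt"], s["battery"]]]))
    Y.append(best)
X = torch.tensor(np.array(X), dtype=torch.float32)
Y = torch.tensor(np.array(Y), dtype=torch.float32)

# Small MLP that learns to imitate the exhaustive search (one sigmoid per component).
model = nn.Sequential(
    nn.Linear(X.shape[1], 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, N_COMPONENTS), nn.Sigmoid(),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()
for epoch in range(50):
    opt.zero_grad()
    loss = loss_fn(model(X), Y)
    loss.backward()
    opt.step()

# At inference time the trained network replaces the 2^N cost evaluations.
print("offload decision:", (model(X[:1]) > 0.5).int().tolist())
```

In this toy setting the network maps the UE and network state directly to a per-component offload probability, so the per-request cost of decision making no longer grows exponentially with the number of components.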